A Majorized ADMM with Indefinite Proximal Terms for Linearly Constrained Convex Composite Optimization
Authors
Abstract
This paper presents a majorized alternating direction method of multipliers (ADMM) with indefinite proximal terms for solving linearly constrained 2-block convex composite optimization problems in which each block of the objective is the sum of a non-smooth convex function ($p(x)$ or $q(y)$) and a smooth convex function ($f(x)$ or $g(y)$), i.e., $\min_{x\in\mathcal{X},\, y\in\mathcal{Y}}\{p(x)+f(x)+q(y)+g(y) \mid \mathcal{A}^*x+\mathcal{B}^*y=c\}$. By choosing the indefinite proximal terms properly, we establish the global convergence of the proposed method, as well as its iteration complexity in the non-ergodic sense, for step-lengths $\tau \in (0,(1+\sqrt{5})/2)$. The computational benefit of using indefinite proximal terms within the ADMM framework, instead of the positive semidefinite ones currently required, is also demonstrated numerically. This opens up a new way to improve the practical performance of the ADMM and related methods.
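For orientation, the following is a rough sketch of the kind of iteration scheme the abstract describes, written in notation chosen here for illustration; the penalty parameter $\sigma$, the multiplier $z^k$, the majorants $\hat f$, $\hat g$ and the proximal operators $\mathcal{S}$, $\mathcal{T}$ are labels introduced in this sketch, not taken verbatim from the paper:

\[
\begin{aligned}
x^{k+1} &\in \operatorname*{arg\,min}_{x\in\mathcal{X}}\Big\{ p(x) + \hat f(x;x^k) + \langle z^k, \mathcal{A}^*x\rangle + \tfrac{\sigma}{2}\|\mathcal{A}^*x + \mathcal{B}^*y^k - c\|^2 + \tfrac{1}{2}\|x - x^k\|_{\mathcal{S}}^2 \Big\},\\
y^{k+1} &\in \operatorname*{arg\,min}_{y\in\mathcal{Y}}\Big\{ q(y) + \hat g(y;y^k) + \langle z^k, \mathcal{B}^*y\rangle + \tfrac{\sigma}{2}\|\mathcal{A}^*x^{k+1} + \mathcal{B}^*y - c\|^2 + \tfrac{1}{2}\|y - y^k\|_{\mathcal{T}}^2 \Big\},\\
z^{k+1} &= z^k + \tau\sigma\big(\mathcal{A}^*x^{k+1} + \mathcal{B}^*y^{k+1} - c\big),
\end{aligned}
\]

where $\hat f(\cdot;x^k)$ and $\hat g(\cdot;y^k)$ are majorizing (e.g., quadratic) models of $f$ and $g$ at the current iterate, and $\|u\|_{\mathcal{S}}^2 := \langle u, \mathcal{S}u\rangle$. The classical analysis requires $\mathcal{S} \succeq 0$ and $\mathcal{T} \succeq 0$; the point of the paper is that suitably chosen indefinite $\mathcal{S}$ and $\mathcal{T}$ still guarantee convergence for $\tau \in (0,(1+\sqrt{5})/2)$.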
Similar articles
On the Q-linear Convergence of a Majorized Proximal ADMM for Convex Composite Programming and Its Applications to Regularized Logistic Regression
This paper aims to study the convergence rate of a majorized alternating direction method of multipliers with indefinite proximal terms (iPADMM) for solving linearly constrained convex composite optimization problems. We establish a Q-linear convergence rate theorem for the 2-block majorized iPADMM under mild conditions. Based on this result, the convergence rate analysis of symmetric Gaussian-Sei...
Symmetric ADMM with Positive-Indefinite Proximal Regularization for Linearly Constrained Convex Optimization
The proximal ADMM, which adds proximal regularization terms to the ADMM subproblems, is a popular and useful method for linearly constrained separable convex problems, especially in its linearized case. A well-known requirement for guaranteeing the convergence of the method in the literature is that the proximal regularization must be positive semidefinite. Recently it was shown by He et al. (Optimization O...
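For concreteness, the proximal regularization referred to in this snippet is commonly written as follows; this is a generic illustration in notation chosen here ($\mathcal{L}_\sigma$, $S$, $\mu$), not the exact scheme of the cited paper:

\[
x^{k+1} \in \operatorname*{arg\,min}_{x\in\mathcal{X}} \Big\{ \mathcal{L}_{\sigma}(x, y^k; z^k) + \tfrac{1}{2}\|x - x^k\|_{S}^2 \Big\},
\]

where $\mathcal{L}_{\sigma}$ is the augmented Lagrangian and $S$ is the proximal matrix, classically required to satisfy $S \succeq 0$. The linearized case corresponds to the choice $S = \mu I - \sigma \mathcal{A}\mathcal{A}^*$ with $\mu \ge \sigma\|\mathcal{A}^*\|^2$, which linearizes the quadratic penalty term at $x^k$ and reduces the subproblem to a single proximal step of the non-smooth part of the objective.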
Inertial Proximal ADMM for Linearly Constrained Separable Convex Optimization
The alternating direction method of multipliers (ADMM) is a popular and efficient first-order method that has recently found numerous applications, and the proximal ADMM is an important variant of it. The main contributions of this paper are the proposal and analysis of a class of inertial proximal ADMMs, which unify the basic ideas of the inertial proximal point method and the proximal ...
Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Programming
In this paper, we aim to prove the linear rate convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex composite optimization problems. Under a mild calmness condition, which holds automatically for convex composite piecewise linear-quadratic programming, we establish the global Q-linear rate of convergence for a general semi-proximal ADMM w...
Linear Rate Convergence of the Alternating Direction Method of Multipliers for Convex Composite Quadratic and Semi-Definite Programming
In this paper, we aim to provide a comprehensive analysis on the linear rate convergence of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex composite optimization problems. Under a certain error bound condition, we establish the global linear rate of convergence for a more general semi-proximal ADMM with the dual steplength being restricted to be i...
Journal: SIAM Journal on Optimization
Volume: 26, Issue: -
Pages: -
Publication date: 2016